Patch validate_device in torch.nn.attention.flex_attention#33

Merged
yeahdongcn merged 2 commits into MooreThreads:main from wenqf11:wqf/patch_validate_device
Feb 25, 2026
Conversation

@wenqf11 (Contributor) commented Feb 14, 2026

Patch validate_device in torch.nn.attention.flex_attention
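The PR diff itself is not shown here, but the general shape of such a patch is a monkey-patch that wraps the device validator so a non-CUDA backend (here, MUSA) passes the check. The sketch below illustrates that pattern on a stand-in module rather than on torch itself; the name `_validate_device`, its signature, and the `"musa"` device string are assumptions, not the repository's actual code.

```python
# Hedged sketch of the monkey-patching pattern a "patch validate_device" change
# typically follows. A stand-in namespace plays the role of
# torch.nn.attention.flex_attention so the example runs without torch installed.
import types

# Stand-in for the flex_attention module (assumption: it exposes a private
# _validate_device helper that rejects devices other than CUDA/CPU).
flex_attention = types.SimpleNamespace()

def _validate_device(device_type: str) -> None:
    # Stand-in for the upstream behavior: only "cuda" and "cpu" pass.
    if device_type not in ("cuda", "cpu"):
        raise ValueError(f"Unsupported device type: {device_type}")

flex_attention._validate_device = _validate_device

def patch_validate_device() -> None:
    """Wrap the original validator so 'musa' devices are accepted as well."""
    original = flex_attention._validate_device

    def patched(device_type: str) -> None:
        if device_type == "musa":
            # Accept the MUSA backend instead of falling through to the
            # strict upstream check.
            return
        original(device_type)

    flex_attention._validate_device = patched

patch_validate_device()
flex_attention._validate_device("musa")  # passes after the patch
```

Wrapping and delegating to the original function (rather than replacing it outright) keeps the upstream validation for all other device types, so future upstream changes to the check still apply.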

@yeahdongcn (Collaborator) commented:

@wenqf11 Could you please add some unit tests to cover the flex_attention patch? Thanks.

@yeahdongcn merged commit 5cade97 into MooreThreads:main on Feb 25, 2026

2 participants